Revert MAP implementation to pycocotools backend #1832

Conversation
can we rather preserve both so have …

Yes, we could do that, but then I would prefer we just add a …

yes, that is also a good option. I just feel we have invested lots of effort to have our own implementation, so dropping it... :( Also this would unblock users who expect exact COCO behavior (even if there are some known bugs)
We should not keep it just because we have invested time in it; that is just a sunk cost fallacy. We should keep it because we intend (with help) to make it stable.
hi 👋 does this PR include breaking API changes? can we still …

yes, it shall be compatible 🦩

@ddelange, no breaking changes :]
One breaking change :] The following code:

```python
from pytorch_lightning import LightningModule
from torchmetrics.detection.mean_ap import MeanAveragePrecision

class Foo(LightningModule):
    def __init__(self):
        super().__init__()
        self.metric = MeanAveragePrecision()

    def on_validation_epoch_end(self):
        self.log_dict(self.metric.compute())
```

used to work in 0.11.4. However, this PR adds a new `classes` key to the output of `compute()`. Here, `self.log_dict` fails because `classes` is a non-scalar tensor that cannot be logged.
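For illustration, a minimal snippet (synthetic boxes, not from the thread) showing the entry that trips up `log_dict` in affected versions:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision()
preds = [{"boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0]]),
          "scores": torch.tensor([0.9]),
          "labels": torch.tensor([0])}]
target = [{"boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0]]),
           "labels": torch.tensor([0])}]
metric.update(preds, target)
res = metric.compute()
print(res["classes"])  # non-scalar tensor; `log_dict` only accepts scalar values
```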
That's not this PR; it's a breaking change that was introduced in v1.0.0.rc0, ref ac64e63

Good catch, thanks!
@adamjstewart @ddelange For now you can also get around it with a simple subclass:

```python
from torchmetrics.detection.mean_ap import MeanAveragePrecision as MAP

class MeanAveragePrecision(MAP):
    def compute(self):
        res = super().compute()
        res.pop("classes")  # drop the entry that breaks `log_dict`
        return res
```
An argument would be good. I would vote for it to default to False so that the above example works out of the box. I can submit a PR if you'd like.

Please feel free to open a PR. I would prefer if we set the default to be `True`.

If it defaults to True then the above code will still not work. The purpose of a PR would be to restore backwards compatibility, not to double down on the change.
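For reference, a sketch of what such an argument could look like; the name `include_classes` is hypothetical and was not part of the actual API at the time of this discussion:

```python
from torchmetrics.detection.mean_ap import MeanAveragePrecision as MAP

class MeanAveragePrecisionWithFlag(MAP):
    """Hypothetical sketch: gate the `classes` entry behind a constructor flag."""

    def __init__(self, include_classes: bool = False, **kwargs):
        super().__init__(**kwargs)
        self.include_classes = include_classes

    def compute(self):
        res = super().compute()
        if not self.include_classes:
            res.pop("classes")  # restore the pre-1.0 output keys
        return res
```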
What does this PR do?
Fixes #1024
Fixes #1164
Reverts the `MeanAveragePrecision` implementation back to its v0.6 version for two primary reasons: it resolves several long-standing correctness issues, and it is much faster (see the benchmark below). In the v0.6 version, `pycocotools` does the major calculations and we "just" provide an interface. The consequence for now is that `pycocotools` will be (re-)introduced as a required dependency for the metric.

TODO list:

- Support the `iou_type` argument (introduced after v0.6)
- Support the `iou_thresholds`, `rec_thresholds`, `max_detection_thresholds` arguments (introduced after v0.6)
- Compare against `pycocotools` directly during testing to check if the output is correct (sketched below)
- Check if the implementation fixes "Order of empty images in the batch changes the mAP results" #1774
- Check if the implementation fixes "Over-estimated map value" #1793
- Check if the implementation fixes "Wrong Calculation of Mean Average Precision" #1184
- Benchmark results below (measured on my laptop)
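A rough sketch of the testing comparison from the TODO list, assuming ground truth and detections already exist in COCO JSON format (file names are placeholders):

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("instances_val.json")          # assumed ground-truth file
coco_dt = coco_gt.loadRes("detections.json")  # assumed detection results
coco_eval = COCOeval(coco_gt, coco_dt, iouType="bbox")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()
reference_map = coco_eval.stats[0]  # AP averaged over IoU=0.50:0.95

# `result` would be the output of MeanAveragePrecision on the same data:
# result = metric.compute()
# assert abs(result["map"].item() - reference_map) < 1e-4
```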
As has been stated many times, the old implementation is orders of magnitude faster than our current implementation.
Benchmark code
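The original benchmark script is collapsed above; a minimal sketch of this kind of timing run (synthetic data, assumed sizes, not the author's benchmark) could look like:

```python
import time
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

def random_sample(n_boxes: int = 100, n_classes: int = 10):
    """One image worth of random xyxy boxes, scores and labels."""
    xy = torch.rand(n_boxes, 2) * 100
    wh = torch.rand(n_boxes, 2) * 20 + 1
    boxes = torch.cat([xy, xy + wh], dim=1)
    pred = {"boxes": boxes, "scores": torch.rand(n_boxes),
            "labels": torch.randint(0, n_classes, (n_boxes,))}
    target = {"boxes": boxes, "labels": torch.randint(0, n_classes, (n_boxes,))}
    return pred, target

metric = MeanAveragePrecision()
samples = [random_sample() for _ in range(50)]
start = time.perf_counter()
for pred, target in samples:
    metric.update([pred], [target])
metric.compute()
print(f"elapsed: {time.perf_counter() - start:.2f} s")
```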
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃
📚 Documentation preview 📚: https://torchmetrics--1832.org.readthedocs.build/en/1832/